PyTorch Dropout Notes
Dropout is commonly used to suppress overfitting, and PyTorch provides a convenient module for it. Its parameter p, however, is easy to misread. In TensorFlow the corresponding parameter is called keep_prob, so it is tempting to assume that p in PyTorch is likewise the fraction of nodes to keep. Experiments show the opposite: p is actually the fraction of nodes to drop. See the example below:
import torch

a = torch.randn(10, 1)  # 10 values drawn from a standard normal
>>> tensor([[ 0.0684],
[-0.2395],
[ 0.0785],
[-0.3815],
[-0.6080],
[-0.1690],
[ 1.0285],
[ 1.1213],
[ 0.5261],
[ 1.1664]])
- p=0.5
torch.nn.Dropout(0.5)(a)
>>> tensor([[ 0.0000],
[-0.0000],
[ 0.0000],
[-0.7631],
[-0.0000],
[-0.0000],
[ 0.0000],
[ 0.0000],
[ 1.0521],
[ 2.3328]])
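Note that the surviving entries are not the original values: in training mode PyTorch applies inverted dropout, scaling every kept element by 1/(1-p) so the expected value of the output matches the input. With p=0.5 the factor is 1/(1-0.5) = 2, which is why -0.3815 becomes -0.7631 and 1.1664 becomes 2.3328. A quick sanity check (a minimal sketch; it assumes none of the original values are exactly zero, which holds almost surely for randn):

import torch

a = torch.randn(10, 1)
out = torch.nn.Dropout(0.5)(a)

# Entries that survived the mask equal the originals scaled by 1/(1-p) = 2.
kept = out != 0
print(torch.allclose(out[kept], a[kept] * 2))
>>> True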
- p=0
torch.nn.Dropout(0)(a)
>>> tensor([[ 0.0684],
[-0.2395],
[ 0.0785],
[-0.3815],
[-0.6080],
[-0.1690],
[ 1.0285],
[ 1.1213],
[ 0.5261],
[ 1.1664]])
- p=1
torch.nn.Dropout(1)(a)
>>> tensor([[0.],
[-0.],
[0.],
[-0.],
[-0.],
[-0.],
[0.],
[0.],
[0.],
[0.]])
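In short, p in torch.nn.Dropout is the drop probability: p=0 keeps everything and p=1 zeroes everything. It is the complement of TensorFlow's keep_prob, so when porting code the translation is p = 1 - keep_prob (a minimal sketch; keep_prob is an illustrative name, not a PyTorch argument):

import torch

keep_prob = 0.8  # TensorFlow-style keep probability
dropout = torch.nn.Dropout(p=1 - keep_prob)  # drops 20% of elements, keeps 80%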